Adaptively local one-dimensional subproblems

Author

  • Jianqing Fan
Abstract

We provide new insight into the difficulty of nonparametrically estimating a whole function, and introduce a new method for finding a minimax lower bound for the global estimation of a function. The idea is to adjust automatically, at each location, the direction toward the nearly hardest one-dimensional subproblem, and to use the difficulty of that one-dimensional subproblem locally. In a variety of contexts, our method gives not only the attainable global rates but also the constant factors. Compared with existing techniques, it has the advantage of being easy to implement and to understand. We illustrate the lower bound with examples from nonparametric density estimation as well as nonparametric regression, and give concise proofs of the lower rates. Applying our lower bound to the deconvolution setting, we obtain the best attainable global rates of convergence; with existing techniques, it would be extremely difficult to solve such a problem.

Abbreviated title: Local 1-d subproblems. AMS 1980 subject classification: Primary 62G20; Secondary 62G05.
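The reduction the abstract alludes to can be sketched with a generic two-point (one-dimensional) lower bound. The notation below is illustrative and not taken from the paper: f_0 is a candidate function, g a perturbation direction, x_0 a fixed location, and P_θ the distribution of the data under f_θ = f_0 + θg.

```latex
% One-dimensional subfamily through f_0 in direction g:
%   f_\theta = f_0 + \theta g,  \theta \in \{-\delta, \delta\}.
% Any estimator \hat T of f(x_0) obeys the two-point bound
\[
  \inf_{\hat T}\ \max_{\theta \in \{-\delta,\,\delta\}}
    \mathbb{E}_\theta\bigl(\hat T - f_\theta(x_0)\bigr)^2
  \;\ge\;
  \delta^2 g(x_0)^2 \,\cdot\,
    \frac{1 - \mathrm{TV}\!\left(P_{\delta},\, P_{-\delta}\right)}{2},
\]
% where TV is the total-variation distance between the two data
% distributions. Optimizing over the direction g at each x_0 (the
% "nearly hardest" subproblem) and aggregating over locations gives
% a global lower bound.
```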


Similar articles

A comparison of a posteriori error estimators for mixed finite element discretizations by Raviart-Thomas elements

We consider mixed finite element discretizations of linear second order elliptic boundary value problems with respect to an adaptively generated hierarchy of possibly highly nonuniform simplicial triangulations. In particular, we present and analyze four different kinds of error estimators: a residual based estimator, a hierarchical one, error estimators relying on the solution of local subprob...


Linearized Alternating Direction Method with Adaptive Penalty for Low-Rank Representation

Many machine learning and signal processing problems can be formulated as linearly constrained convex programs, which could be efficiently solved by the alternating direction method (ADM). However, usually the subproblems in ADM are easily solvable only when the linear mappings in the constraints are identities. To address this issue, we propose a linearized ADM (LADM) method by linearizing the...
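The blurb above is cut off; as a hedged illustration of the linearization idea it describes (not the paper's LADM with adaptive penalty — the penalty beta is kept fixed here for simplicity, and all names are ours), here is a minimal linearized ADM-style solver for min ‖x‖₁ subject to Ax = b:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def linearized_adm(A, b, beta=1.0, n_iter=5000):
    """Minimize ||x||_1 s.t. Ax = b by linearizing the quadratic
    penalty of the augmented Lagrangian at the current iterate,
    so each x-update is a cheap shrinkage instead of a full solve.
    Fixed penalty beta; eta must exceed ||A||_2^2 for convergence."""
    m, n = A.shape
    x = np.zeros(n)
    lam = np.zeros(m)                       # Lagrange multiplier
    eta = np.linalg.norm(A, 2) ** 2 * 1.02  # > spectral norm squared
    for _ in range(n_iter):
        # gradient of <lam, Ax - b> + (beta/2)||Ax - b||^2 at x
        grad = A.T @ (lam + beta * (A @ x - b))
        x = soft_threshold(x - grad / (beta * eta), 1.0 / (beta * eta))
        lam = lam + beta * (A @ x - b)      # dual ascent step
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
x_true = np.zeros(50)
x_true[[3, 17, 41]] = [1.5, -2.0, 0.8]      # sparse ground truth
b = A @ x_true
x_hat = linearized_adm(A, b)
print(np.linalg.norm(A @ x_hat - b))        # constraint residual
```

The point of the linearization is that the x-subproblem stays solvable in closed form even though A is not the identity.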


An Adaptive Evolutionary Multi-Objective Approach Based on Simulated Annealing

A multi-objective optimization problem can be solved by decomposing it into one or more single objective subproblems in some multi-objective metaheuristic algorithms. Each subproblem corresponds to one weighted aggregation function. For example, MOEA/D is an evolutionary multi-objective optimization (EMO) algorithm that attempts to optimize multiple subproblems simultaneously by evolving a popu...
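The decomposition step described above (one single-objective subproblem per weight vector) can be sketched with the weighted Tchebycheff aggregation commonly used in MOEA/D-style methods. The toy objectives and names below are ours, chosen only for illustration:

```python
import numpy as np

def tchebycheff(f_vals, w, z_star):
    """Weighted Tchebycheff aggregation: each weight vector w turns the
    multi-objective problem into one scalar subproblem; z_star is the
    ideal point (componentwise best objective values)."""
    return np.max(np.array(w) * np.abs(np.array(f_vals) - np.array(z_star)))

# Two objectives on [0, 1]: f1(x) = x, f2(x) = (1 - x)^2
f = lambda x: (x, (1.0 - x) ** 2)
z_star = (0.0, 0.0)                               # ideal point
weights = [(i / 4, 1 - i / 4) for i in range(5)]  # 5 subproblems

# Brute-force each scalar subproblem on a grid (a real EMO algorithm
# would evolve a population over all subproblems simultaneously).
xs = np.linspace(0.0, 1.0, 1001)
best = [min(xs, key=lambda x: tchebycheff(f(x), w, z_star))
        for w in weights]
print(best)   # one (approximately Pareto-optimal) solution per subproblem
```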


A Differential Evolution and Spatial Distribution based Local Search for Training Fuzzy Wavelet Neural Network

Abstract   Many parameter-tuning algorithms have been proposed for training Fuzzy Wavelet Neural Networks (FWNNs). Absence of an appropriate structure, convergence to local optima, and low speed in learning algorithms are deficiencies of FWNNs in previous studies. In this paper, a Memetic Algorithm (MA) is introduced to train FWNNs and address the aforementioned learning deficiencies. Differential Evolution...


Experimental Study for Protection of Piers Against Local Scour Using Slots

Local scour is among the most important causes of bridge failure. In this study, laboratory experiments were conducted to investigate the effectiveness of a slot as a protection device in reducing the depth of scour at cylindrical piers under clear-water flow conditions. The development of scour depth over time at a circular pier, with and without a slot as a protection device, was measured. The experimen...




Publication date: 1989